
    Investigating established EEG parameter during real-world driving

    © 2018 Protzak and Gramann. In real life, behavior is influenced by dynamically changing contextual factors and is rarely limited to simple tasks and binary choices. For a meaningful interpretation of brain dynamics underlying more natural cognitive processing in active humans, ecologically valid test scenarios are essential. To understand whether brain dynamics in restricted artificial lab settings reflect the neural activity in complex natural environments, we systematically tested the auditory event-related P300 in both settings. We developed an integrative approach comprising an initial P300 study in a highly controlled laboratory set-up and a subsequent validation within a realistic driving scenario. Using a simulated dialog with a speech-based input system, increased P300 amplitudes reflected processing of infrequent and incorrect auditory feedback events in both the laboratory setting and the real-world setup. Environmental noise and movement-related activity in the car driving scenario led to higher data rejection rates but revealed comparable theta and alpha frequency band patterns. Our results demonstrate that cognitive functions like context updating can be investigated even in highly artifact-prone driving scenarios and encourage the consideration of more realistic task settings in prospective brain imaging approaches.
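
A minimal sketch of the kind of auditory-oddball P300 analysis described above, using MNE-Python. This is not the authors' pipeline: the file name, event codes, channel, and rejection threshold are placeholder assumptions.

```python
import mne

raw = mne.io.read_raw_fif("driving_eeg_raw.fif", preload=True)  # hypothetical recording
raw.filter(0.1, 30.0)                          # band-pass typical for ERP analysis

events = mne.find_events(raw)                  # assumes a stim/trigger channel exists
event_id = {"frequent": 1, "infrequent": 2}    # placeholder trigger codes

epochs = mne.Epochs(
    raw, events, event_id, tmin=-0.2, tmax=0.8,
    baseline=(None, 0),
    reject=dict(eeg=100e-6),                   # crude amplitude-based artifact rejection
    preload=True,
)

# Average per condition and read out mean amplitude in a typical P300 window at Pz.
p3_window = (0.3, 0.5)
for cond in event_id:
    evoked = epochs[cond].average()
    amp = evoked.copy().pick("Pz").crop(*p3_window).data.mean() * 1e6
    print(f"{cond}: mean {p3_window[0]}-{p3_window[1]} s amplitude at Pz = {amp:.2f} µV")
```

Larger amplitudes for the infrequent condition in such a readout would correspond to the P300 effect reported for both the laboratory and the driving setting.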

    Single‐trial regression of spatial exploration behavior indicates posterior EEG alpha modulation to reflect egocentric coding

    Learning to navigate uncharted terrain is a key cognitive ability that emerges as a deeply embodied process, with eye movements and locomotion proving most useful to sample the environment. We studied healthy human participants during active spatial learning of room-scale virtual reality (VR) mazes. In the invisible maze task, participants wearing a wireless electroencephalography (EEG) headset were free to explore their surroundings, given only the objective to build and foster a mental spatial representation of their environment. Spatial uncertainty was resolved by touching otherwise invisible walls that were briefly rendered visible inside VR, similar to finding one's way in the dark. We showcase the capabilities of mobile brain/body imaging using VR, demonstrating several analysis approaches based on general linear models (GLMs) to reveal behavior-dependent brain dynamics. Confirming spatial learning via drawn sketch maps, we employed motion capture to characterize spatial exploration behavior, describing a shift from initial exploration to subsequent exploitation of the mental representation. Using independent component analysis, the current work specifically targeted oscillations in response to wall touches, reflecting isolated spatial learning events arising in deep posterior EEG sources located in the retrosplenial complex. Single-trial regression identified significant modulation of alpha oscillations by the immediate, egocentric exploration behavior: alpha power decreased when participants encountered novel walls and with increasing walking distance between subsequent touches of novel walls. We conclude that these oscillations play a prominent role during egocentric evidencing of allocentric spatial hypotheses.
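
A schematic single-trial regression in the spirit described above (not the authors' code): log alpha power around each wall touch is regressed on behavioral predictors. All variable names and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200

# Hypothetical per-trial measures
alpha_power = rng.lognormal(mean=0.0, sigma=0.5, size=n_trials)  # posterior alpha power
novel_wall = rng.integers(0, 2, size=n_trials)                   # 1 = novel wall touched
walk_dist = rng.gamma(shape=2.0, scale=1.5, size=n_trials)       # meters since last touch

def zscore(x):
    return (x - x.mean()) / x.std()

# Design matrix: intercept plus z-scored predictors
X = np.column_stack([np.ones(n_trials), zscore(novel_wall), zscore(walk_dist)])
y = np.log(alpha_power)                                           # log-power as dependent variable

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("betas (intercept, novelty, distance):", np.round(beta, 3))

# Negative betas for novelty and distance would correspond to the reported
# alpha power decrease with novel walls and longer walked distance.
```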

    Overcoming Spatial Deskilling Using Landmark-Based Navigation Assistance Systems

    Background: The repeated use of navigation assistance systems leads to decreased spatial orienting abilities. Previous studies demonstrated that augmenting landmarks in auditory navigation instructions can improve incidental spatial learning when driving on a single route through an unfamiliar environment. Objective: Based on these results, a series of experiments was conducted to further investigate both the impairment of spatial knowledge acquisition by standard navigation instructions and the positive impact of landmark augmentation in auditory navigation instructions on incidental spatial learning. Method: The first experiment replicated the previous setup in a driving simulator without additional visual route indicators. In a second experiment, spatial knowledge was tested after watching a video depicting assisted navigation along a real-world urban route. Finally, a third experiment investigated incidental spatial knowledge acquisition when participants actively navigated through an unrestricted real-world, urban environment. Results: All three experiments demonstrated better cued-recall performance for participants navigating with landmark-based auditory navigation instructions as compared to standard instructions. Notably, standard instructions were associated with reduced learning of landmarks at navigation-relevant intersections as compared to landmarks alongside straight segments and to the recognition of novel landmarks. Conclusion: The results revealed a suppression of spatial learning by established navigation instructions, which was overcome by landmark-based navigation instructions. This emphasizes the positive impact of auditory landmark augmentation on incidental spatial learning and its generalizability to real-life settings. Application: This research paves the way for navigation assistants that, instead of impairing orienting abilities, incidentally foster spatial learning during everyday navigation. Précis: This series of three experiments replicates the suppression of spatial learning by standard navigation instructions and the positive impact of landmark augmentation in auditory navigation instructions on incidental spatial learning during assisted navigation. Three experiments with increasing degrees of realism demonstrated the applicability and generalizability to real-life settings.

    Mobile brain/body imaging (MoBI) of physical interaction with dynamically moving objects

    © 2016 Jungnickel and Gramann. The non-invasive recording and analysis of human brain activity during active movements in natural working conditions is a central challenge in Neuroergonomics research. Existing brain imaging approaches do not allow for an investigation of brain dynamics during active behavior because their sensors cannot follow the movement of the signal source. However, movements that require the operator to react quickly and adapt to a dynamically changing environment occur frequently in working environments such as assembly-line work, the construction trades, and health care, but also outside of work, for example in team sports. Overcoming the restrictions of existing imaging methods would allow for deeper insights into neurocognitive processes at workplaces that require physical interactions and thus could help to adapt work settings to the user. To investigate the brain dynamics accompanying rapid volatile movements, we used a visual oddball paradigm where participants had to react to color changes either with a simple button press or by physically pointing towards a moving target. Using a mobile brain/body imaging (MoBI) approach including independent component analysis (ICA) with subsequent backprojection of cluster activity allowed for systematically describing the contribution of brain and non-brain sources to the sensor signal. The results demonstrate that visual event-related potentials (ERPs) can be analyzed for simple button presses and physical pointing responses and that it is possible to quantify the contribution of brain processes, muscle activity, and eye movements to the signal recorded at the sensor level even for fast volatile arm movements with strong jerks. Using MoBI in naturalistic working environments can thus help to analyze brain dynamics in natural working conditions and to improve unhealthy or inefficient work settings.
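
A minimal sketch of how backprojecting ICA components can quantify their contribution to the sensor signal, similar in spirit to the cluster backprojection described above. The file name, the hand-picked split into "brain" and "non-brain" components, and the channel are assumptions, not the study's actual pipeline, and the variance split is only approximate.

```python
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_fif("mobi_pointing_raw.fif", preload=True)  # placeholder file
raw.filter(1.0, 100.0)                          # high-pass filtering helps ICA decomposition

ica = ICA(n_components=20, random_state=97)
ica.fit(raw)

# Hypothetical component split; in practice this would come from component
# labeling or clustering across participants.
brain_comps = [0, 1, 2, 3]
artifact_comps = [c for c in range(ica.n_components_) if c not in brain_comps]

# Backproject each subset to the sensors and compare variance at one channel.
brain_only = ica.apply(raw.copy(), include=brain_comps)
artifact_only = ica.apply(raw.copy(), include=artifact_comps)

ch = "Cz"                                        # placeholder channel
var_brain = brain_only.get_data(picks=ch).var()
var_artifact = artifact_only.get_data(picks=ch).var()
total = var_brain + var_artifact
print(f"approx. share of variance at {ch}: brain {var_brain/total:.1%}, "
      f"non-brain {var_artifact/total:.1%}")
```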

    Modified navigation instructions for spatial navigation assistance systems lead to incidental spatial learning

    © 2017 Gramann, Hoepner and Karrer-Gauss. Spatial cognitive skills deteriorate with the increasing use of automated GPS navigation, and a general decrease in the ability to orient in space might further impact independence, autonomy, and quality of life. In the present study we investigate whether modified navigation instructions support incidental spatial knowledge acquisition. A virtual driving environment was used to examine the impact of modified navigation instructions on spatial learning while using a GPS navigation assistance system. Participants navigated through a simulated urban and suburban environment, using navigation support to reach their destination. Both driving performance and spatial learning were assessed. Three navigation instruction conditions were tested: (i) a control group that was provided with classical navigation instructions at decision points, and two other groups that received navigation instructions at decision points including either (ii) additional irrelevant information about landmarks or (iii) additional personally relevant information (i.e., individual preferences regarding food, hobbies, etc.) associated with landmarks. Driving performance revealed no differences between navigation instruction conditions. Significant improvements were observed in both modified navigation instruction conditions on three different measures of spatial learning and memory: subsequent navigation of the initial route without navigation assistance, landmark recognition, and sketch map drawing. Future navigation assistance systems could incorporate modified instructions to promote incidental spatial learning and to foster more general spatial cognitive abilities. Such systems might extend mobility across the lifespan.

    The brain dynamics of architectural affordances during transition.

    Action is a medium of collecting sensory information about the environment, which in turn is shaped by architectural affordances. Affordances characterize the fit between the physical structure of the body and capacities for movement and interaction with the environment, thus relying on sensorimotor processes associated with exploring the surroundings. Central to sensorimotor brain dynamics, the attentional mechanisms directing the gating function of sensory signals share neuronal resources with motor-related processes necessary for inferring the external causes of sensory signals. Such a predictive coding approach suggests that sensorimotor dynamics are sensitive to architectural affordances that support or suppress specific kinds of actions for an individual. However, how architectural affordances relate to the attentional mechanisms underlying the gating function for sensory signals remains unknown. Here we demonstrate that event-related desynchronization of alpha-band oscillations in parieto-occipital and medio-temporal regions covaries with architectural affordances. Source-level time-frequency analysis of data recorded in a motor-priming Mobile Brain/Body Imaging experiment revealed strong event-related desynchronization of the alpha band originating from the posterior cingulate complex, the parahippocampal region, and the occipital cortex. Our results first contribute to the understanding of how the brain resolves architectural affordances relevant to behaviour. Second, our results indicate that alpha-band activity originating from the occipital cortex and parahippocampal region covaries with the architectural affordances before participants interact with the environment, whereas during the interaction, the posterior cingulate cortex and motor areas dynamically reflect the affordable behaviour. We conclude that sensorimotor dynamics reflect behaviour-relevant features in the designed environment.
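
A generic sketch of how event-related desynchronization (ERD) of the alpha band can be expressed as the percentage power change relative to a pre-stimulus baseline. The epoch array, sampling rate, and time axis are placeholder assumptions, not the authors' data or source-level pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                                         # assumed sampling rate (Hz)
times = np.arange(-0.5, 1.5, 1 / fs)             # epoch from -0.5 to 1.5 s
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, times.size))  # placeholder: trials x samples, one channel

# Band-pass in the alpha range and take instantaneous power via the Hilbert envelope.
b, a = butter(4, [8, 13], btype="bandpass", fs=fs)
alpha = filtfilt(b, a, epochs, axis=-1)
power = np.abs(hilbert(alpha, axis=-1)) ** 2

# ERD(t) = (power - baseline) / baseline * 100, averaged over trials;
# negative values indicate desynchronization relative to baseline.
baseline = power[:, times < 0].mean(axis=-1, keepdims=True)
erd = ((power - baseline) / baseline * 100).mean(axis=0)
print("mean alpha ERD 0-1 s:", round(float(erd[(times >= 0) & (times <= 1)].mean()), 1), "%")
```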

    EEG correlates of spatial orientation in the human retrosplenial complex

    © 2015 Elsevier Inc. Studies on spatial navigation reliably demonstrate that the retrosplenial complex (RSC) plays a pivotal role in allocentric spatial information processing by transforming egocentric and allocentric spatial information into the respective other spatial reference frame (SRF). While more and more imaging studies investigate the role of the RSC in spatial tasks, high-temporal-resolution measures such as electroencephalography (EEG) are missing. To investigate the function of the RSC in spatial navigation with high temporal resolution, we used EEG to analyze spectral perturbations during navigation based on allocentric and egocentric SRFs. Participants performed a path integration task in a clearly structured virtual environment providing allothetic information. Continuous EEG recordings were decomposed by independent component analysis (ICA) with subsequent source reconstruction of independent component time series using equivalent dipole modeling. Time-frequency transformation was used to investigate reference-frame-specific orientation processes during navigation as compared to a control condition with identical visual input but no orientation task. Our results demonstrate that navigation based on an egocentric reference frame recruited a network including the parietal, motor, and occipital cortices, with dominant perturbations in the alpha band and theta modulation in frontal cortex. Allocentric navigation was accompanied by performance-related desynchronization of the 8–13 Hz frequency band and synchronization in the 12–14 Hz band in the RSC. The results support the claim that the retrosplenial complex is central to translating egocentric spatial information into allocentric reference frames. Modulations in different frequencies with different time courses in the RSC further provide first evidence of two distinct neural processes reflecting the translation of spatial information between distinct reference frames and the computation of heading changes.
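
A minimal sketch of condition-specific spectral perturbations computed with Morlet wavelets in MNE-Python. The epochs file, condition names, frequency range, and baseline window are assumptions, not the study's actual source-level pipeline.

```python
import numpy as np
import mne
from mne.time_frequency import tfr_morlet

epochs = mne.read_epochs("path_integration-epo.fif")    # placeholder epochs file
freqs = np.arange(4, 30, 1)                             # theta to beta range
n_cycles = freqs / 2.0

tfrs = {}
for cond in ("navigation", "control"):                  # placeholder condition names
    power = tfr_morlet(epochs[cond], freqs=freqs, n_cycles=n_cycles,
                       return_itc=False, average=True)
    # Baseline-normalize to obtain event-related spectral perturbations (log ratio).
    power.apply_baseline(baseline=(-0.5, 0.0), mode="logratio")
    tfrs[cond] = power

# Contrast the two conditions, e.g. to inspect alpha-band differences over posterior channels.
diff = tfrs["navigation"].data - tfrs["control"].data
print("TFR difference shape (channels x freqs x times):", diff.shape)
```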

    Familiarity with speech affects cortical processing of auditory distance cues and increases acuity

    Several acoustic cues contribute to auditory distance estimation. Nonacoustic cues, including familiarity, may also play a role. We tested participants' ability to distinguish the distances of acoustically similar sounds that differed in familiarity. Participants were better able to judge the distances of familiar sounds. Electroencephalographic (EEG) recordings collected while participants performed this auditory distance judgment task revealed that several cortical regions responded in different ways depending on sound familiarity. Surprisingly, these differences were observed in auditory cortical regions as well as other cortical regions distributed throughout both hemispheres. These data suggest that learning about subtle, distance-dependent variations in complex speech sounds involves processing in a broad cortical network that contributes both to speech recognition and to how spatial information is extracted from speech. © 2012 Wisniewski et al.

    Sensory-motor brain dynamics reflect architectural affordances

    Anticipating meaningful actions in the environment is an essential function of the brain. Such predictive mechanisms originate from the motor system and allow for inferring actions from environmental affordances, the potential to act within a specific environment. Using architecture, we provide a unique perspective on the abiding debate in cognitive neuroscience and philosophy on whether cognition depends on movement or is decoupled from our physical structure. To investigate cognitive processes associated with architectural affordances, we used a Mobile Brain/Body Imaging approach recording brain activity synchronized to head-mounted virtual reality. Participants perceived and acted upon virtual transitions ranging from non-passable to easily passable. We demonstrate that early sensory brain activity, upon revealing the environment and before actual movement, differed as a function of affordances. Additionally, movement through transitions was preceded by a motor-related negative component that also depended on affordances. Our results suggest that potential actions afforded by an environment influence perception.
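
A rough sketch of how a movement-preceding negative component could be quantified per affordance condition with MNE-Python: epochs are time-locked to movement onset and mean amplitude in a pre-movement window is compared across conditions. The file name, event codes, channel, and windows are hypothetical, not the study's pipeline.

```python
import mne

raw = mne.io.read_raw_fif("transition_vr_raw.fif", preload=True)   # hypothetical recording
raw.filter(0.1, 30.0)

events = mne.find_events(raw)                         # assumes movement-onset triggers
event_id = {"narrow": 11, "mid": 12, "wide": 13}      # placeholder affordance conditions

# Time-lock to movement onset; use an interval well before onset as baseline.
epochs = mne.Epochs(raw, events, event_id, tmin=-1.5, tmax=0.5,
                    baseline=(-1.5, -1.0), preload=True)

window = (-0.5, 0.0)                                  # pre-movement window of interest
for cond in event_id:
    evoked = epochs[cond].average()
    amp = evoked.copy().pick("Cz").crop(*window).data.mean() * 1e6
    print(f"{cond}: mean pre-movement amplitude at Cz = {amp:.2f} µV")
```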